Evolutionary based black-box adversarial attack. #2215
Conversation
Signed-off-by: Ali Osman TOPAL [email protected]
Hi @aliotopal Thank you for your pull request! I'll change the target branch to the dev branch for the next release, 1.16.
Codecov Report
❗ Your organization needs to install the Codecov GitHub app to enable full functionality.

@@            Coverage Diff             @@
##           dev_1.16.0    #2215      +/- ##
==============================================
- Coverage       85.62%    84.88%    -0.74%
==============================================
  Files             324       325        +1
  Lines           29323     29458      +135
  Branches         5405      5413        +8
==============================================
- Hits            25108     25006      -102
- Misses           2837      3075      +238
+ Partials         1378      1377        -1
Hi @aliotopal Could you please take a look at the Style Checks? You can solve most of the issues by running Black formatter version black==21.12b0 with the commands black --line-length 120 art/ and black --line-length 120 tests/.
Hi @beat-buesser, I have done the style check; it should be ok now. Can you please check whether everything is ok? Best,
The style check is done. Redundant parts are removed.
Fixed trailing whitespace, blank line, and missing value errors.
Hi @beat-buesser,
The attack was not compatible with the framework libraries; this is fixed and it now works.
art_2 changed to art
Hi @beat-buesser,
art/attacks/evasion/attack_EA.py
Outdated
# def _get_class_prob(preds: np.ndarray, class_no: np.array) -> np.ndarray:
#     '''
#     :param preds: an array of predictions of individuals for all the categories: (40, 1000) shaped array
#     :param class_no: for the targeted attack, target category index number; for the untargeted attack,
#                      ancestor category index number
#     :return: an array of the prediction of individuals only for the target/ancestor category: (40,) shaped array
#     '''
#     return preds[:, class_no]
Check notice (Code scanning / CodeQL): Commented-out code
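For reference, a runnable version of this flagged helper would look like the minimal sketch below; it is illustrative only (the function stays commented out in the PR), and the example shapes and class index are assumptions taken from the docstring, not from the final code.

```python
import numpy as np


def _get_class_prob(preds: np.ndarray, class_no: int) -> np.ndarray:
    """Return each individual's prediction for a single category.

    :param preds: predictions of all individuals for all categories, e.g. a (40, 1000) array.
    :param class_no: target category index (targeted attack) or ancestor category index (untargeted attack).
    :return: predictions of the individuals for that category only, e.g. a (40,) array.
    """
    return preds[:, class_no]


# Illustrative usage: 40 individuals, 1000 categories (values are made up).
preds = np.random.rand(40, 1000)
probs = _get_class_prob(preds, class_no=7)  # shape (40,)
```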
Style check errors are fixed.
1. Module name changed from attack_EA.py to attack_ea.
2. random.shuffle() is removed; it was redundant.
3. x_ is changed to x_temp.
Hi @beat-buesser, I went over the style errors; it should be fine now (I hope).
For the other errors, I don't know what to do. Can you please have a look once more?
Dear @beat-buesser, I am sorry, I know I am becoming a pain :(, but when you have time, can you please have a look at our attack?
random.choices() problem is solved.
Dear @beat-buesser, Finally, the style check is giving only this error: "Class name "attack_ea" doesn't conform to PascalCase naming style (invalid-name)". I don't want to change the file name, if that is ok. Besides that, there are some more errors in "mypy", but it seems they are not related to our attack. Can you please have a look and give us direction?
Hi @beat-buesser,
I cleared the Style Checks error. There are some error messages from mypy about warpPerspective, which has nothing to do with our attack as far as I understand.
In the CI Keras tests, 4 jobs failed, but the error is "Error uploading to https://codecov.io", so there is nothing I can do there, I guess.
Please let me know if there is something that I can do.
Best,
Ali
Description
I have developed an evolutionary algorithm-based black-box adversarial attack (attack_EA). It is added to the evasion attacks section.
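A hypothetical usage sketch is shown below. The module and class names follow this thread (art.attacks.evasion.attack_ea, class attack_ea), but the constructor arguments, the toy model, and the classifier wrapper are illustrative assumptions based on ART's standard evasion-attack interface, not the merged API.

```python
import numpy as np
import tensorflow as tf
from art.estimators.classification import TensorFlowV2Classifier

# Hypothetical import: names follow this PR thread; the final API may differ.
from art.attacks.evasion.attack_ea import attack_ea

# Tiny stand-in tf.keras model; a real use case would wrap a trained classifier.
model = tf.keras.Sequential(
    [
        tf.keras.layers.Flatten(input_shape=(32, 32, 3)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ]
)

classifier = TensorFlowV2Classifier(
    model=model,
    nb_classes=10,
    input_shape=(32, 32, 3),
    loss_object=tf.keras.losses.CategoricalCrossentropy(),
)

attack = attack_ea(classifier)         # EA-specific parameters (population size, generations, ...) omitted
x_test = np.random.rand(4, 32, 32, 3).astype(np.float32)
x_adv = attack.generate(x=x_test)      # standard ART evasion-attack call
```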
Fixes # (issue)
Type of change
Please check all relevant options.
Testing
Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.
Test Configuration:
Checklist